Data Democratization


Data Encoding For Healthcare Data Democratisation and Information Leakage Prevention

Thakur, Anshul, Zhu, Tingting, Abrol, Vinayak, Armstrong, Jacob, Wang, Yujiang, Clifton, David A.

arXiv.org Artificial Intelligence

In recent years, deep learning has demonstrated remarkable success in a wide variety of fields [1], and it is expected to have a significant impact on healthcare as well [2]. Many attempts have been made to achieve this breakthrough in healthcare informatics, which often deals with noisy, heterogeneous, and non-standardized electronic health records (EHRs) [3]. However, most clinical deep learning tools are either not robust enough or have not been tested in real-world scenarios [4, 5]. Deep learning solutions approved by regulatory bodies remain rare in healthcare informatics, which shows that deep learning has not had the same level of success as in other fields such as speech and image processing [6]. Along with the well-known explainability challenges of deep learning models [7], the lack of data democratization [8] and latent information leakage (information leakage from trained models) [9, 10] can also be regarded as major hindrances to the development and acceptance of robust clinical deep learning solutions. In the current context, data democratization and information leakage can be described as follows. Data democratization: making digital healthcare data available to a wider cohort of AI researchers.
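The paper's central idea, releasing encoded rather than raw EHR data, can be illustrated with a minimal sketch. This is not the authors' actual encoding method; it simply shows the general pattern of mapping raw clinical features through a fixed, keyed transformation so that encoded vectors, not original values, are shared for model training. All function names and dimensions below are illustrative assumptions.

```python
import numpy as np

def encode_features(X, key_seed=0, out_dim=32):
    """Illustrative encoder: project raw feature vectors through a fixed
    random matrix (derived from a secret seed) plus a non-linearity,
    so the shared representation is not the raw clinical data."""
    rng = np.random.default_rng(key_seed)   # the seed acts as the encoding key
    in_dim = X.shape[1]
    W = rng.standard_normal((in_dim, out_dim)) / np.sqrt(out_dim)
    return np.tanh(X @ W)                   # non-linearity hinders trivial inversion

# Example: 5 synthetic "patient" records with 10 features each
X = np.random.default_rng(1).standard_normal((5, 10))
Z = encode_features(X)
print(Z.shape)  # (5, 32)
```

Downstream models would then be trained on `Z` instead of `X`; without the key seed, recovering the original feature values from `Z` is made harder, which is the leakage-prevention intuition.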


Using AI and Data Analytics to Monetize Data: 4 Techniques

#artificialintelligence

The economic value of data is challenging for companies to conceptualize and measure directly. Many executives have the wrong perception of data monetization: to them, the only way to derive economic value from data is to sell it to other companies. As a result, they overlook the immense untapped value that it represents. Using big data analytics and AI, companies can monetize the data they produce directly or indirectly by improving customer experiences, reducing costs, finding new customers, and much more.


Council Post: Why AI Projects Are Failing At Your Company

#artificialintelligence

My company, Alation, partnered with Wakefield Research to survey senior enterprise data leaders about AI. We wanted to understand how prevalent AI projects are, what drives success and, more generally, what the trends are. Not surprisingly, we found that almost every enterprise is deploying or has plans to deploy AI. Organizations have copious data -- data that continues to grow daily -- and they have a ton of business problems they could apply that data to. We've all seen the case studies about how AI helps drive innovation in products and services and improves operational efficiency and customer experience.


Council Post: Four Steps To Data Democratization With Artificial Intelligence

#artificialintelligence

Data democratization should be a top concern for every company moving forward. We're approaching a point where the problem of too much data (and too few insights) can't be ignored any longer. Companies are generating more customer, employee and operational data than ever. But that data remains underleveraged and, in some cases, a liability. According to a report from Splunk, 55% of the data in every organization is "dark data," or "all the unknown and untapped data across your organization, generated by systems, devices and interactions."



The unmistakable impact of AI on agencies Federal News Network

#artificialintelligence

We are using machine learning to control situations where there are a lot of variables. Data democratization means everyone has access to these data and tools. There are a ton of great tools out there that help folks who maybe aren't data scientists, but are data-science-y, make better decisions at work. The growth of artificial intelligence and machine learning over the last few years is unmistakable. Agencies have realized the potential and real benefits of using these advanced technologies to improve decision making, analyze large databases and address mission challenges.


Data Exchange and Marketplace, a New Business Model in Making

@machinelearnbot

The Internet of Things (IoT), also known as the Internet of objects, refers to the networked interconnection of everyday physical devices (20 billion by 2020, according to Gartner). Such devices will be an integral part of next-generation computing, and they will produce astronomical data volumes, catapulting us into the world of zettabytes and yottabytes. Data is the new oil: for some, it is a byproduct of doing operations, while for others, the same data can be a catalyst to capture new insights, build AI models and drive innovation. Data Explorers and Data Miners: It will not be easy to find valuable data in the massive data reservoirs acquired from diverse sources. Exploration requires tremendous effort, and there will be an opportunity for players and service providers who can choose an area or segment(s) and build competency there.


AI and machine learning give new meaning to embedded analytics

#artificialintelligence

Data democratization, the concept of empowering any employee to make data-driven decisions for their company regardless of skill set, was supposed to rival the Elysian Fields in its paradise-worthy promise of analytics. It's easy to see why this concept made such a big splash in the enterprise. Since the early 2000s, companies have been amassing raw data, which has morphed into the $203 billion big data analytics market. But this data always lacked transparency. Housed in messy data architectures that led to siloed information, companies struggled to gain a singular big-picture view of what their data was telling them. IT departments were left to sort through data warehouse integrations and complex extract, transform, load (ETL) processes to try to create enough structure for any data analytics tool to provide a single view into solving business problems.
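The ETL processes mentioned above follow a simple extract, transform, load pattern. A toy sketch of that pattern, with SQLite standing in for the data warehouse and all table and field names invented for illustration:

```python
import csv, io, sqlite3

# Extract: read raw rows from a CSV source (an in-memory string here)
raw = io.StringIO("region,amount\nnorth,100\nsouth,250\nnorth,50\n")
rows = list(csv.DictReader(raw))

# Transform: normalize region names and cast amounts to integers
clean = [{"region": r["region"].upper(), "amount": int(r["amount"])} for r in rows]

# Load: insert into a warehouse table (SQLite stands in for the warehouse)
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
db.executemany("INSERT INTO sales VALUES (:region, :amount)", clean)

# A "single view" query over the loaded data
for region, total in db.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"):
    print(region, total)
```

In real deployments each stage is far messier (many heterogeneous sources, schema reconciliation, incremental loads), which is exactly the complexity the article says IT departments were left to untangle.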